Search - C4.5 algorithm

Search list

[AI-NN-PR] dataming

Description: Introduces the ten main data mining algorithms and their applications: methods that analyse the large volumes of data stored within an enterprise through mathematical models to identify distinct customer or market segments and to analyse consumer preferences and behaviour. -Top 10 algorithms in data mining. This paper presents the top 10 data mining algorithms identified by the IEEE International Conference on Data Mining (ICDM) in December 2006: C4.5, k-Means, SVM, Apriori, EM, PageRank, AdaBoost, kNN, Naive Bayes, and CART. These top 10 algorithms are among the most influential data mining algorithms in the research community. With each algorithm, we provide a description of the algorithm, discuss the impact of the algorithm, and review current and further research on the algorithm. These 10 algorithms cover classification,
Platform: | Size: 633856 | Author: andyzygg | Hits:

[AI-NN-PR] DecisionTree

Description: The classic C4.5 decision tree algorithm, built with VS2010; helpful for students learning artificial intelligence.
Platform: | Size: 4885504 | Author: 凌遥雪 | Hits:

[AI-NN-PR] machine-learning-2

Description: The C4.5 and CART machine learning algorithms: classic English-language machine learning material, described in detail and easy to learn from.
Platform: | Size: 727040 | Author: zhongrui | Hits:

[AI-NN-PR] DTree

Description: A system implementing classification decision tree algorithms: ID3 and C4.5.
Platform: | Size: 245760 | Author: 董旭 | Hits:

[AI-NN-PR] C4_5

Description: A fairly detailed document introducing the C4.5 learning algorithm.
Platform: | Size: 268288 | Author: 杨啸晗 | Hits:

[Program doc] adaboos

Description: When the weak learner is a simple classifier, boosting clearly and consistently outperforms bagging. When the weak learner is C4.5, boosting still beats bagging, but the advantage is less pronounced than in the former comparison.
Platform: | Size: 152576 | Author: 王孟贤 | Hits:

[Algorithm] J48

Description: Source code for the J48 algorithm, WEKA's implementation of C4.5.
Platform: | Size: 5120 | Author: xiali | Hits:

[AI-NN-PR] JAVA-decisiontree

Description: This program is written in Java. Before running it, make sure JDK 1.7 or later is installed and the JDK environment variables are configured, then import the source code into the Eclipse IDE. Implements the decision tree algorithms ID3 and C4.5.
Platform: | Size: 596992 | Author: 杨伟 | Hits:

[Mathimatics-Numerical algorithms] tree

Description: Data mining: a Java implementation of the C4.5 decision tree algorithm.
Platform: | Size: 928768 | Author: 王倩 | Hits:

[Delphi VCL] EncDecC1C2C3C4

Description: Old version of the MU Online C1 C2 C3 C4 packet algorithm (EncDecC1C2C3C4.pas).
Platform: | Size: 3072 | Author: q1w2e3e3w2q1 | Hits:

[AI-NN-PR] c45

Description: A C implementation of the C4.5 decision tree algorithm, with naming conventions and a set of build tools.
Platform: | Size: 154624 | Author: | Hits:

[Software Engineering] adaboost

Description: Implement the AdaBoost.M1 and AdaBoost.M2 algorithms, two versions of AdaBoost for handling problems with more than two classes. First read the paper "Experiments with a New Boosting Algorithm". Use Weka's decision stump and C4.5 classifiers as the base classifiers for AdaBoost.M1, and a decision stump as the base classifier for AdaBoost.M2.
Platform: | Size: 432128 | Author: hajar | Hits:

[Mathimatics-Numerical algorithms] Decision Trees and Random Forests

Description: An overview of decision trees and random forests. Mainly analyses decision tree learning algorithms — information gain and ID3, C4.5, and CART trees — and then introduces random forests. Three questions matter most for decision trees: 1. Feature selection: which feature to split on at a given node; 2. Choice of split value: how to divide into subtrees once a feature is chosen; 3. What to do when the decision tree overfits? The text addresses each of these in turn. A decision tree is typically built by recursively selecting the optimal feature and partitioning the training data according to it.
Platform: | Size: 2114560 | Author: ZJN27 | Hits:
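The entry above mentions the split criteria behind these trees: ID3 picks the feature with the highest information gain, while C4.5 normalises that gain by the split's own entropy (gain ratio) to avoid favouring many-valued features. A minimal Python sketch of both criteria, written for illustration and not taken from any of the packages listed here:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, feature_values):
    """ID3 criterion: entropy reduction from splitting `labels` by `feature_values`."""
    n = len(labels)
    subsets = {}
    for v, y in zip(feature_values, labels):
        subsets.setdefault(v, []).append(y)
    remainder = sum(len(s) / n * entropy(s) for s in subsets.values())
    return entropy(labels) - remainder

def gain_ratio(labels, feature_values):
    """C4.5 criterion: information gain normalised by the split's own entropy."""
    split_info = entropy(feature_values)
    return information_gain(labels, feature_values) / split_info if split_info else 0.0
```

For a perfectly separating binary split (labels `['yes','yes','no','no']`, feature `['a','a','b','b']`) both the gain and the gain ratio come out to 1.0; a real tree builder would evaluate these criteria for every candidate feature at each node and recurse on the best one.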

[Other] treePlotter

Description: Plots the tree structure produced by the ID3, C4.5, and CART decision tree algorithms.
Platform: | Size: 1024 | Author: zhaoliang123 | Hits:

CodeBus www.codebus.net